Landauer's Principle, first argued in 1961[1] by Rolf Landauer of IBM, holds that "any logically irreversible manipulation of information, such as the erasure of a bit or the merging of two computation paths, must be accompanied by a corresponding entropy increase in non-information bearing degrees of freedom of the information processing apparatus or its environment". (Bennett 2003)[2].
Landauer's principle specifies the Landauer limit, the minimum possible amount of energy required to change one bit of information:

E ≥ kT ln 2,

where k is the Boltzmann constant (approximately 1.38×10⁻²³ J/K), T is the temperature of the heat sink in kelvins, and ln 2 is the natural logarithm of 2 (approximately 0.693).
At 25 °C (room temperature, or 298.15 kelvins), the Landauer limit represents an energy of approximately 0.0178 electron volt. Theoretically, room‑temperature computer memory operating at the Landauer limit could be changed at a rate of one billion bits per second with only 2.85 trillionths of a watt of power being expended in the memory media.
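The figures above can be checked directly from the formula. A minimal sketch in Python, using the standard values of the Boltzmann constant and the electronvolt (the 10⁹ bits-per-second rate is the illustrative scenario from the text):

```python
import math

k = 1.380649e-23      # Boltzmann constant, J/K
T = 298.15            # room temperature (25 °C), K
eV = 1.602176634e-19  # joules per electronvolt

# Landauer limit: minimum energy to change one bit at temperature T
E = k * T * math.log(2)

print(E)        # ≈ 2.85e-21 J per bit
print(E / eV)   # ≈ 0.0178 eV per bit

rate = 1e9      # one billion bit operations per second
print(E * rate) # ≈ 2.85e-12 W, i.e. 2.85 trillionths of a watt
```

This reproduces both numbers quoted in the text: roughly 0.0178 eV per bit, and a few picowatts for a memory changing a billion bits per second at the limit.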
If no information is erased, computation may in principle be thermodynamically reversible, and thus require no release of heat. This has led to considerable interest in the study of reversible computing.
The principle is widely accepted as a physical law, but in recent years it has been challenged, notably by Earman and Norton (1998), and subsequently by Shenker (2000)[3] and Norton (2004)[4]; it has been defended by Bennett (2003)[5] and Ladyman et al. (2007).[6]
Landauer's principle can be understood to be a simple logical consequence of the second law of thermodynamics—which states that the entropy of a closed system cannot decrease—together with the definition of thermodynamic temperature. For, if the number of possible logical states of a computation were to decrease as the computation proceeded forward (logical irreversibility), this would constitute a forbidden decrease of entropy, unless the number of possible physical states corresponding to each logical state were to simultaneously increase by at least a compensating amount, so that the total number of possible physical states was no smaller than originally (total entropy has not decreased).
Yet an increase in the number of physical states corresponding to each logical state means that for an observer who is keeping track of the logical state of the system but not the physical state (for example an "observer" consisting of the computer itself), the number of possible physical states has increased; in other words, entropy has increased from the point of view of this observer. The maximum entropy of a bounded physical system is finite. (If the holographic principle is correct, then physical systems with finite surface area have a finite maximum entropy; but regardless of the truth of the holographic principle, quantum field theory dictates that the entropy of systems with finite radius and energy is finite.) So, to avoid reaching this maximum over the course of an extended computation, entropy must eventually be expelled to an outside environment at some given temperature T, requiring that energy E = ST must be emitted into that environment if the amount of added entropy is S. For a computational operation in which 1 bit of logical information is lost, the amount of entropy generated is at least k ln 2, and so the energy that must eventually be emitted to the environment is E ≥ kT ln 2.
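The entropy bookkeeping in the argument above can be made concrete. A minimal sketch, assuming the statistical definition S = k ln W for W equally likely states (the helper function below is illustrative, not from the original text):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def min_heat_for_state_merge(states_before, states_after, T):
    """Minimum heat (J) expelled to an environment at temperature T
    when a computation irreversibly maps states_before equally likely
    logical states onto states_after states."""
    # Entropy lost by the logical state: S = k ln(W_before / W_after)
    entropy_lost = k * math.log(states_before / states_after)
    # Energy that must eventually be emitted: E = S*T, as in the text
    return entropy_lost * T

# Erasing one bit merges 2 logical states into 1, costing kT ln 2:
print(min_heat_for_state_merge(2, 1, 300.0))     # ≈ 2.87e-21 J
# Losing n bits costs n times as much, e.g. 8 bits:
print(min_heat_for_state_merge(2**8, 1, 300.0))
```

The single-bit case recovers the E ≥ kT ln 2 bound, and the cost scales linearly with the number of bits of logical information lost.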
In 2003, Weiss and Weiss, drawing on psychometric data and theoretical considerations, concluded that information processing by the brain has to be based on Landauer's principle.[7] In 2008 this was empirically confirmed by a group of neurobiologists.[8]
This expression for the minimum energy dissipation from a logically irreversible binary operation was first suggested by John von Neumann, but it was first rigorously justified (with important limits on its applicability stated) by Landauer. For this reason, it is sometimes referred to simply as the Landauer bound or Landauer limit.